Rosenbrock methods may refer to either of two distinct ideas in numerical computation, both named for Howard H. Rosenbrock. Rosenbrock optimization methods are a family of numerical optimization algorithms applicable to problems in which the objective function is inexpensive to compute but its derivative cannot be computed efficiently.[1] Rosenbrock methods for stiff differential equations are methods for solving ordinary differential equations whose solutions contain a wide range of characteristic timescales.[2]
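In the stiff-ODE sense of the term, a Rosenbrock method replaces the Newton iteration of an implicit solver with a fixed number of linear solves involving the Jacobian. The simplest instance is the one-stage "linearly implicit Euler" scheme; the following is a minimal sketch for a scalar equation y' = f(t, y), where the function and variable names are illustrative rather than from any particular library:

```python
def linearly_implicit_euler(f, dfdy, y0, t0, t1, n_steps):
    """One-stage Rosenbrock method (linearly implicit Euler) for a
    scalar ODE y' = f(t, y). Each step solves one linear equation,
    (1 - h*J) * k = f(t, y), instead of iterating Newton's method."""
    h = (t1 - t0) / n_steps
    t, y = t0, y0
    for _ in range(n_steps):
        J = dfdy(t, y)               # Jacobian df/dy at the current point
        k = f(t, y) / (1.0 - h * J)  # single linear solve per step
        y += h * k
        t += h
    return y

# Stiff test problem: y' = -50*y, y(0) = 1, integrated to t = 1 with
# h = 0.1. Explicit Euler oscillates and diverges at this step size;
# the linearly implicit step decays stably toward zero.
y_end = linearly_implicit_euler(lambda t, y: -50.0 * y,
                                lambda t, y: -50.0,
                                1.0, 0.0, 1.0, 10)
```

For vector-valued systems the scalar division becomes a linear system solve with the matrix (I - hJ); practical Rosenbrock methods use several such stages to reach higher order.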
Rosenbrock optimization methods are related to the Nelder–Mead method, but with better convergence properties.
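In the optimization sense, Rosenbrock's method is a derivative-free direct search: it takes trial steps along a set of directions, expanding a step after a success and reversing and shrinking it after a failure (the full method also periodically re-orthonormalizes the direction set, which is omitted here). A simplified sketch along the coordinate axes, with illustrative names and the classic 3 / -0.5 step factors:

```python
def rosenbrock_search(f, x0, step=0.1, expand=3.0, contract=-0.5,
                      tol=1e-8, max_sweeps=10000):
    """Simplified Rosenbrock-style direct search (no direction rotation).

    Sweeps over the coordinate directions; a successful trial step is
    expanded, a failed one is reversed and shrunk. Stops when a full
    sweep yields no improvement and all steps are below tol.
    """
    x = [float(v) for v in x0]
    steps = [step] * len(x)
    fx = f(x)
    for _ in range(max_sweeps):
        improved = False
        for i in range(len(x)):
            trial = x[:]
            trial[i] += steps[i]
            ft = f(trial)
            if ft < fx:
                x, fx = trial, ft
                steps[i] *= expand   # success: lengthen the step
                improved = True
            else:
                steps[i] *= contract  # failure: reverse and shrink
        if not improved and max(abs(s) for s in steps) < tol:
            break
    return x, fx

# Minimize a simple quadratic with minimum at (1, -2); only function
# values are used, no derivatives.
xmin, fmin = rosenbrock_search(lambda v: (v[0] - 1)**2 + (v[1] + 2)**2,
                               [0.0, 0.0])
```

The method needs only function evaluations, which is why it suits the setting described above where the objective is cheap but derivatives are unavailable.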